
Conversation


@xenova xenova commented Apr 1, 2025

Fixes #1253, allowing local models to be loaded either via env.localModelPath or via a path_or_repo_id that is itself a valid local path (e.g., ./path/to/model). This better aligns with the Python library's behavior.

In other words, with this directory structure,

model-dir/
└── distilbert-base-uncased-finetuned-sst-2-english/
    ├── config.json
    ├── tokenizer.json
    ├── tokenizer_config.json
    └── onnx/
        └── model.onnx

the following all now work:

1. Loading from env.localModelPath

import { env, pipeline } from '@huggingface/transformers';
env.allowRemoteModels = false; // just for testing

env.localModelPath = "./model-dir/";
const pipe = await pipeline("text-classification", "distilbert-base-uncased-finetuned-sst-2-english");
const output = await pipe("Hello world!");

2. Loading via a path

import { env, pipeline } from '@huggingface/transformers';
env.allowRemoteModels = false; // just for testing

const pipe = await pipeline("text-classification", "./model-dir/distilbert-base-uncased-finetuned-sst-2-english");
const output = await pipe("Hello world!");

3. Loading from hub & caching

import { pipeline } from '@huggingface/transformers';
const pipe = await pipeline("text-classification", "Xenova/distilbert-base-uncased-finetuned-sst-2-english");
const output = await pipe("Hello world!");

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@xenova xenova merged commit 4968f0f into main Apr 2, 2025
4 checks passed
@xenova xenova deleted the fix-local-path branch April 2, 2025 00:02


Development

Successfully merging this pull request may close these issues.

3.4.0 version of transformer does not respect env.localPath
